Search for: All records

Creators/Authors contains: "Snyder, Caitlin"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Recently, there has been a surge in developing curricula and tools that integrate computing (C) into Science, Technology, Engineering, and Math (STEM) programs. These environments foster authentic problem-solving while facilitating students’ concurrent learning of STEM+C content. In our study, we analyzed students’ behaviors as they worked in pairs to create computational kinematics models of object motion. We derived a domain-specific metric from students’ collaborative dialogue that measured how they integrated science and computing concepts into their problem-solving tasks. Additionally, we computed social metrics such as equity and turn-taking based on the students’ dialogue. We identified and characterized students’ planning, enacting, monitoring, and reflecting behaviors as they worked together on their model construction tasks. This study investigates the impact of students’ collaborative behaviors on their performance in STEM+C computational modeling tasks. By analyzing the relationships of group synergy, turn-taking, and equity measures with task performance, we provide insights into how these collaborative behaviors influence students’ ability to construct accurate models. Our findings underscore the importance of synergistic discourse for overall task success, particularly during the enactment, monitoring, and reflection phases. Conversely, variations in equity and turn-taking have a minimal impact on segment-level task performance.
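    The abstract above mentions equity and turn-taking measures computed from the students’ dialogue but does not give their definitions. The Python sketch below shows one plausible way such measures could be operationalized from a sequence of speaker-labeled turns; the function names and the exact formulas are illustrative assumptions, not the paper’s own metrics.

        from collections import Counter

        def turn_taking_rate(speakers):
            """Fraction of adjacent turns where the speaker changes (0 = monologue, 1 = strict alternation)."""
            if len(speakers) < 2:
                return 0.0
            switches = sum(1 for a, b in zip(speakers, speakers[1:]) if a != b)
            return switches / (len(speakers) - 1)

        def equity(speakers):
            """1 minus the normalized gap between the dominant speaker's turns and the rest (1 = balanced)."""
            counts = Counter(speakers)
            total = sum(counts.values())
            if total == 0:
                return 0.0
            top = max(counts.values())
            return 1.0 - abs(top - (total - top)) / total

        # Example: a short dialogue segment labeled by speaker
        segment = ["A", "A", "B", "A", "B", "B", "A"]
        print(turn_taking_rate(segment), equity(segment))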
  2. Collaborative problem-solving (CPS) in STEM+C education involves cognitive coordination and emotional regulation during joint tasks. Prior research has examined discrete affective states in learning environments, but less is known about how these emotions evolve over time and affect CPS behavior. This study investigates the temporal dynamics of five emotions—engagement, confusion, boredom, delight, and frustration—using Markov Chain analysis of data from high school pairs building computational models in the C2STEM environment. Emotional transitions were aligned with cognitive processes, captured through interaction patterns such as PLAY, ADJUST, and BUILD, to analyze affect during modeling. Results show that emotional trajectories closely relate to cognitive actions, including construction, simulation testing, and debugging. Transitions that maintained engagement were linked to productive collaboration and stronger performance, while ongoing frustration and boredom signaled progressive disengagement.
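    The Markov Chain analysis named above is not detailed in the abstract. Below is a minimal sketch of the standard first-order estimate it presumably builds on: affect is coded as a sequence of emotion labels over time windows, and transition probabilities are the row-normalized counts of consecutive label pairs. The label set matches the abstract; everything else is an illustrative assumption.

        import numpy as np

        EMOTIONS = ["engagement", "confusion", "boredom", "delight", "frustration"]
        IDX = {e: i for i, e in enumerate(EMOTIONS)}

        def transition_matrix(sequence):
            """Estimate first-order Markov transition probabilities from a coded emotion sequence."""
            counts = np.zeros((len(EMOTIONS), len(EMOTIONS)))
            for a, b in zip(sequence, sequence[1:]):
                counts[IDX[a], IDX[b]] += 1
            row_sums = counts.sum(axis=1, keepdims=True)
            row_sums[row_sums == 0] = 1.0  # avoid division by zero for states never visited
            return counts / row_sums

        # Example: one pair's coded affect over consecutive time windows
        coded = ["engagement", "engagement", "confusion", "engagement", "frustration", "boredom"]
        P = transition_matrix(coded)
        print(P[IDX["engagement"], IDX["confusion"]])  # P(confusion | engagement)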
  3. This paper explores the design of two types of pedagogical agents—teaching and peer—in a collaborative STEM+C learning environment, C2STEM, where high school students learn physics (kinematics) and computing by building computational models that simulate the motion of objects. Through in-depth case study interviews with teachers and students, we identify role-based features for these agents to support collaborative learning in open-ended STEM+C learning environments. We propose twelve design principles—four for teaching agents, four for peer agents, and four shared by both—contributing to foundational guidelines for developing agents that enhance collaborative learning through computational modeling. 
  4. Recent advances in generative artificial intelligence (AI) and multimodal learning analytics (MMLA) have allowed for new and creative ways of leveraging AI to support K12 students’ collaborative learning in STEM+C domains. To date, there is little evidence of AI methods supporting students’ collaboration in complex, open-ended environments. AI systems are known to underperform humans in (1) interpreting students’ emotions in learning contexts, (2) grasping the nuances of social interactions and (3) understanding domain-specific information that was not well-represented in the training data. As such, combined human and AI (ie, hybrid) approaches are needed to overcome the current limitations of AI systems. In this paper, we take a first step towards investigating how a human-AI collaboration between teachers and researchers using an AI-generated multimodal timeline can guide and support teachers’ feedback while addressing students’ STEM+C difficulties as they work collaboratively to build computational models and solve problems. In doing so, we present a framework characterizing the human component of our human-AI partnership as a collaboration between teachers and researchers. To evaluate our approach, we present our timeline to a high school teacher and discuss the key insights gleaned from our discussions. Our case study analysis reveals the effectiveness of an iterative approach to using human-AI collaboration to address students’ STEM+C challenges: the teacher can use the AI-generated timeline to guide formative feedback for students, and the researchers can leverage the teacher’s feedback to help improve the multimodal timeline. Additionally, we characterize our findings with respect to two events of interest to the teacher: (1) when the students cross a difficulty threshold, and (2) the point of intervention, that is, when the teacher (or system) should intervene to provide effective feedback. It is important to note that the teacher explained that there should be a lag between (1) and (2) to give students a chance to resolve their own difficulties. Typically, such a lag is not implemented in computer-based learning environments that provide feedback.
    Practitioner notes
    What is already known about this topic
    - Collaborative, open-ended learning environments enhance students’ STEM+C conceptual understanding and practice, but they introduce additional complexities when students learn concepts spanning multiple domains.
    - Recent advances in generative AI and MMLA allow for integrating multiple datastreams to derive holistic views of students’ states, which can support more informed feedback mechanisms to address students’ difficulties in complex STEM+C environments.
    - Hybrid human-AI approaches can help address collaborating students’ STEM+C difficulties by combining the domain knowledge, emotional intelligence and social awareness of human experts with the general knowledge and efficiency of AI.
    What this paper adds
    - We extend a previous human-AI collaboration framework using a hybrid intelligence approach to characterize the human component of the partnership as a researcher-teacher partnership and present our approach as a teacher-researcher-AI collaboration.
    - We adapt an AI-generated multimodal timeline to actualize our human-AI collaboration by pairing the timeline with videos of students encountering difficulties, engaging in active discussions with a high school teacher while watching the videos to discern the timeline’s utility in the classroom.
    - From our discussions with the teacher, we define two types of inflection points to address students’ STEM+C difficulties—the difficulty threshold and the intervention point—and discuss how the feedback latency interval separating them can inform educator interventions.
    - We discuss two ways in which our teacher-researcher-AI collaboration can help teachers support students encountering STEM+C difficulties: (1) teachers using the multimodal timeline to guide feedback for students, and (2) researchers using teachers’ input to iteratively refine the multimodal timeline.
    Implications for practice and/or policy
    - Our case study suggests that timeline gaps (ie, disengaged behaviour identified by off-screen students, pauses in discourse and lulls in environment actions) are particularly important for identifying inflection points and formulating formative feedback.
    - Human-AI collaboration exists on a dynamic spectrum and requires varying degrees of human control and AI automation depending on the context of the learning task and students’ work in the environment.
    - Our analysis of this human-AI collaboration using a multimodal timeline can be extended in the future to support students and teachers in additional ways, for example, designing pedagogical agents that interact directly with students, developing intervention and reflection tools for teachers, helping teachers craft daily lesson plans and aiding teachers and administrators in designing curricula.
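    The practitioner notes above single out timeline gaps (pauses in discourse and lulls in environment actions) as cues for locating inflection points. As a rough illustration only, the sketch below flags such gaps in a timestamped event stream; the 60-second threshold and the event format are assumptions, not the paper’s implementation.

        from datetime import datetime, timedelta

        def find_gaps(events, min_gap=timedelta(seconds=60)):
            """Return (start, end) intervals with no logged discourse or environment events.

            `events` is a list of (timestamp, source) tuples sorted by timestamp,
            where source might be "speech" or "action".
            """
            gaps = []
            for (t1, _), (t2, _) in zip(events, events[1:]):
                if t2 - t1 >= min_gap:
                    gaps.append((t1, t2))
            return gaps

        # Example log: speech and model-building actions from one group
        log = [
            (datetime(2024, 3, 1, 10, 0, 0), "speech"),
            (datetime(2024, 3, 1, 10, 0, 20), "action"),
            (datetime(2024, 3, 1, 10, 2, 5), "speech"),  # ~105 s of silence before this event
        ]
        print(find_gaps(log))  # candidate difficulty-threshold inflection point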
  5. The incorporation of technology into primary and secondary education has facilitated the creation of curricula that utilize computational tools for problem-solving. In Open-Ended Learning Environments (OELEs), students participate in learning-by-modeling activities that enhance their understanding of Science, Technology, Engineering, and Mathematics (STEM) and computational concepts. This research presents an innovative multimodal emotion recognition approach that analyzes facial expressions and speech data to identify pertinent learning-centered emotions, such as engagement, delight, confusion, frustration, and boredom. Utilizing sophisticated machine learning algorithms, including the High-Speed Face Emotion Recognition (HSEmotion) model for visual data and wav2vec 2.0 for auditory data, our method is refined with a modality verification step and a fusion layer for accurate emotion classification. The multimodal technique significantly increases emotion detection accuracy, with an overall accuracy of 87% and an F1-score of 84%. The study also correlates these emotions with model-building strategies in collaborative settings, with statistical analyses indicating distinct emotional patterns associated with effective and ineffective strategy use in model construction and debugging tasks. These findings underscore the role of adaptive learning environments in fostering students’ emotional and cognitive development.
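    The fusion layer mentioned in the abstract above is not specified. One simple late-fusion scheme, offered only as an illustration, averages the per-class probabilities produced by the face and speech models with fixed weights and falls back to a single modality when the other is missing (e.g., the student is off-screen). The class list, weights, and fallback rule are assumptions.

        import numpy as np

        CLASSES = ["engagement", "delight", "confusion", "frustration", "boredom"]

        def fuse(face_probs, audio_probs, w_face=0.6, w_audio=0.4):
            """Late fusion: weighted average of per-modality class probabilities."""
            if face_probs is None and audio_probs is None:
                return None
            if face_probs is None:
                return np.asarray(audio_probs, dtype=float)
            if audio_probs is None:
                return np.asarray(face_probs, dtype=float)
            fused = w_face * np.asarray(face_probs) + w_audio * np.asarray(audio_probs)
            return fused / fused.sum()  # renormalize to a probability distribution

        # Example: hypothetical per-class probabilities from the two models
        face = [0.55, 0.10, 0.20, 0.10, 0.05]
        audio = [0.30, 0.05, 0.40, 0.15, 0.10]
        probs = fuse(face, audio)
        print(CLASSES[int(np.argmax(probs))])  # fused emotion label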
  6. Clarke-Midura, J; Kollar, I; Gu, X; D’Angelo, C (Ed.)
    In collaborative problem-solving (CPS), students work together to solve problems using their collective knowledge and social interactions to understand the problem and progress towards a solution. This study focuses on how students engage in CPS while working in pairs in a STEM+C (Science, Technology, Engineering, Mathematics, and Computing) environment that involves open-ended computational modeling tasks. Specifically, we study how groups with different prior knowledge in physics and computing concepts differ in their information pooling and consensus-building behaviors. In addition, we examine how these differences impact the development of their shared understanding and learning. Our study consisted of a high school kinematics curriculum with 1D and 2D modeling tasks. Using an exploratory approach, we performed in-depth case studies to analyze the behaviors of groups with different prior knowledge distributions across these tasks. We identify effective information pooling and consensus-building behaviors in addition to difficulties students faced when developing a shared understanding of physics and computing concepts. 
  7. LLMs have demonstrated proficiency in contextualizing their outputs using human input, often matching or beating human-level performance on a variety of tasks. However, LLMs have not yet been used to characterize synergistic learning in students’ collaborative discourse. In this exploratory work, we take a first step towards adopting a human-in-the-loop prompt engineering approach with GPT-4-Turbo to summarize and categorize students’ synergistic learning during collaborative discourse. Our preliminary findings suggest GPT-4-Turbo may be able to characterize students’ synergistic learning in a manner comparable to humans and that our approach warrants further investigation. 
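    The abstract above describes human-in-the-loop prompt engineering with GPT-4-Turbo but does not include the prompts. The sketch below shows how a single categorization call might look with the OpenAI Python SDK; the prompt wording, label set, and example dialogue are illustrative assumptions, not the authors’ protocol.

        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        # Hypothetical label set for synergistic (science + computing) discourse
        LABELS = "physics-only, computing-only, synergistic, off-task"

        def categorize_excerpt(excerpt: str) -> str:
            """Ask the model to summarize and label a short collaborative-dialogue excerpt."""
            response = client.chat.completions.create(
                model="gpt-4-turbo",
                messages=[
                    {"role": "system",
                     "content": ("You analyze high school students' collaborative dialogue about "
                                 "computational kinematics models. Summarize the excerpt in one "
                                 f"sentence, then label it with exactly one of: {LABELS}.")},
                    {"role": "user", "content": excerpt},
                ],
                temperature=0,
            )
            return response.choices[0].message.content

        print(categorize_excerpt("S1: If we just set the x velocity it won't accelerate. "
                                 "S2: Right, we need to add delta v times delta t each step."))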
  8. Grieff, S. (Ed.)
    Recently, there has been increased development of curricula and tools that integrate computing (C) into Science, Technology, Engineering, and Math (STEM) learning environments. These environments serve as a catalyst for authentic collaborative problem-solving (CPS) and help students synergistically learn STEM+C content. In this work, we analyzed students’ collaborative problem-solving behaviors as they worked in pairs to construct computational models in kinematics. We leveraged social measures, such as equity and turn-taking, along with a domain-specific measure that quantifies the synergistic interleaving of science and computing concepts in the students’ dialogue to gain a deeper understanding of the relationship between students’ collaborative behaviors and their ability to complete a STEM+C computational modeling task. Our results extend past findings identifying the importance of synergistic dialogue and suggest that while equitable discourse is important for overall task success, fluctuations in equity and turn-taking at the segment level may not have an impact on segment-level task performance. To better understand students’ segment-level behaviors, we identified and characterized groups’ planning, enacting, and reflection behaviors along with the monitoring processes they employed to check their progress as they constructed their models. Leveraging Markov Chain (MC) analysis, we identified differences in how high- and low-performing groups transitioned between these phases of activity. We then compared the synergistic, turn-taking, and equity measures for these groups for each of the MC model states to gain a deeper understanding of how these collaboration behaviors relate to their computational modeling performance. We believe that characterizing differences in collaborative problem-solving behaviors allows us to gain a better understanding of the difficulties students face as they work on their computational modeling tasks.
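    The abstract above applies Markov Chain analysis to planning, enacting, monitoring, and reflecting phases and contrasts high- and low-performing groups. As a rough sketch of that comparison, the snippet below estimates a first-order transition matrix for each group (as in the earlier emotion example) and differences them; the phase coding and the toy sequences are assumptions for illustration.

        import numpy as np

        PHASES = ["plan", "enact", "monitor", "reflect"]
        IDX = {p: i for i, p in enumerate(PHASES)}

        def transitions(seq):
            """First-order transition probabilities over a coded phase sequence."""
            counts = np.zeros((len(PHASES), len(PHASES)))
            for a, b in zip(seq, seq[1:]):
                counts[IDX[a], IDX[b]] += 1
            rows = counts.sum(axis=1, keepdims=True)
            rows[rows == 0] = 1.0
            return counts / rows

        # Hypothetical coded phase sequences for one high- and one low-performing group
        high = ["plan", "enact", "monitor", "reflect", "plan", "enact", "monitor"]
        low = ["enact", "enact", "enact", "monitor", "enact", "enact"]

        diff = transitions(high) - transitions(low)
        print(diff[IDX["enact"], IDX["monitor"]])  # where the two groups' transition patterns differ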